$R$ -linear convergence of limited memory steepest descent
Authors
Abstract
Similar articles
R-Linear Convergence of Limited Memory Steepest Descent
The limited memory steepest descent method (LMSD) proposed by Fletcher is an extension of the Barzilai-Borwein “two-point step size” strategy for steepest descent methods for solving unconstrained optimization problems. It is known that the Barzilai-Borwein strategy yields a method with an R-linear rate of convergence when it is employed to minimize a strongly convex quadratic. This paper exten...
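The Barzilai-Borwein "two-point step size" mentioned above can be illustrated on a strongly convex quadratic. The following is a minimal sketch of the classical BB1 step $\alpha_k = s^\top s / s^\top y$ applied to $f(x) = \tfrac{1}{2}x^\top A x - b^\top x$; it is not Fletcher's LMSD method itself, and the initial step size is an arbitrary conservative choice.

```python
import numpy as np

def bb_gradient_descent(A, b, x0, max_iter=200, tol=1e-10):
    """Steepest descent with the Barzilai-Borwein (BB1) step size
    on the strongly convex quadratic f(x) = 0.5 x^T A x - b^T x."""
    x = x0.copy()
    g = A @ x - b                        # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(A)      # conservative first step (assumption)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g      # the "two-point" differences
        alpha = (s @ s) / (s @ y)        # BB1 step size: s^T s / s^T y
        x, g = x_new, g_new
    return x

A = np.diag([1.0, 10.0, 100.0])          # condition number 100
b = np.array([1.0, 1.0, 1.0])
x_star = bb_gradient_descent(A, b, np.zeros(3))
# x_star approximates the minimizer A^{-1} b
```

On quadratics like this one, the iterates exhibit the nonmonotone but R-linear convergence behavior that the paper analyzes.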
A limited memory steepest descent method
The possibilities inherent in steepest descent methods have been considerably amplified by the introduction of the Barzilai-Borwein choice of step size, and other related ideas. These methods have proved to be competitive with conjugate gradient methods for large-scale unconstrained minimization problems. This paper suggests a method which is able to take advantage of th...
Handling Nonpositive Curvature in a Limited Memory Steepest Descent Method
We propose a limited memory steepest descent method for solving unconstrained optimization problems. As a steepest descent method, the step computation in each iteration only requires the evaluation of a gradient of the objective function and the calculation of a scalar stepsize. When employed to solve certain convex problems, our method reduces to a variant of the limited memory steepest desce...
Asymptotic Convergence of the Steepest Descent Method for the Exponential Penalty in Linear Programming
$\dot{u}(t) = -\nabla_x f(u(t), r(t)),\ u(t_0) = u_0$, where $f(x, r)$ is the exponential penalty function associated with the linear program $\min\{c^\top x : Ax \le b\}$, and $r(t)$ decreases to $0$ as $t \to \infty$. We show that for each initial condition $(t_0, u_0)$ the solution $u(t)$ is defined on the whole interval $[t_0, \infty)$ and, under suitable hypotheses on the rate of decrease of $r(t)$, we establish the convergence of $u(t)$ towards a...
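The gradient flow above can be sketched numerically with forward Euler. The penalty form $f(x, r) = c^\top x + r \sum_i \exp((a_i^\top x - b_i)/r)$, the decreasing schedule for $r$, and the tiny one-variable LP are all assumptions made for illustration, not the paper's setting.

```python
import numpy as np

# Toy LP: min x subject to -x <= 0 (i.e. x >= 0); optimal solution x* = 0.
c = np.array([1.0])
A = np.array([[-1.0]])
b = np.array([0.0])

def grad_f(x, r):
    """Gradient of the assumed exponential penalty
    f(x, r) = c^T x + r * sum_i exp((a_i^T x - b_i) / r)."""
    return c + A.T @ np.exp((A @ x - b) / r)

x = np.array([1.0])                  # feasible starting point
dt = 1e-3                            # Euler step for the flow u' = -grad_f
for k in range(50_000):
    r = 1.0 / (1.0 + 1e-3 * k)       # slowly decreasing penalty parameter (assumption)
    x = x - dt * grad_f(x, r)
# x drifts toward the LP optimum x* = 0 as r shrinks
```

Decreasing $r$ too quickly makes the flow stiff, which is precisely why the paper imposes hypotheses on the rate of decrease of $r(t)$.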
Global Convergence of Steepest Descent for Quadratic Functions
This paper analyzes the effect of momentum on steepest descent training for quadratic performance functions. Some global convergence conditions of the steepest descent algorithm are obtained by directly analyzing the exact momentum equations for quadratic cost functions. Those conditions can be derived directly from the parameters (as opposed to the eigenvalue-based conditions used in existing results). ...
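A hypothetical sketch of steepest descent with a momentum ("heavy ball") term on a quadratic cost; the learning rate and momentum coefficient below are illustrative choices, not the conditions derived in the paper.

```python
import numpy as np

def momentum_descent(A, b, x0, lr=0.01, beta=0.9, iters=2000):
    """Gradient descent with momentum on f(x) = 0.5 x^T A x - b^T x."""
    x, v = x0.copy(), np.zeros_like(x0)
    for _ in range(iters):
        g = A @ x - b                  # exact gradient of the quadratic
        v = beta * v - lr * g          # momentum (heavy ball) update
        x = x + v
    return x

A = np.diag([1.0, 10.0, 50.0])         # quadratic performance function
b = np.ones(3)
x = momentum_descent(A, b, np.zeros(3))
# x approximates the minimizer A^{-1} b
```

For a quadratic, convergence of this recursion depends only on `lr`, `beta`, and the spectrum of `A`, which is the kind of relationship between parameters and convergence that the paper makes precise.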
Journal
Journal title: IMA Journal of Numerical Analysis
Year: 2017
ISSN: 0272-4979, 1464-3642
DOI: 10.1093/imanum/drx016